Nonlinear AdaBoost algorithm based on statistics for K-nearest neighbors
GOU Fu, ZHENG Kai
Journal of Computer Applications    2015, 35 (9): 2579-2583.   DOI: 10.11772/j.issn.1001-9081.2015.09.2579
Abstract
AdaBoost is one of the most popular boosting algorithms in data mining. After analyzing the drawbacks of the traditional AdaBoost, which combines its base classifiers linearly, a new algorithm was proposed that replaces the linear addition with a nonlinear combination and replaces the constant weights learned in the training stage with dynamic, instance-dependent parameters computed in the prediction stage from statistics of the K-nearest neighbors. In this way, the weight assigned to each base classifier better reflects the local data around the instance being classified. The experimental results show that, compared with the traditional AdaBoost, the new algorithm improves prediction accuracy by nearly seven percentage points at most and achieves higher classification accuracy on most data sets.
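The abstract does not specify the exact nonlinear combination or the precise K-nearest-neighbor statistics used in the paper, so the following is only a minimal sketch of the general idea: train an AdaBoost-style ensemble of decision stumps as usual, but at prediction time re-weight each stump by its accuracy on the K nearest training neighbors of the test instance instead of by the fixed training-stage weight. All hyperparameters, the choice of decision stumps, and the use of local accuracy as the dynamic weight are illustrative assumptions, not details taken from the paper.

```python
# Sketch only (not the authors' exact method): AdaBoost-style training with
# decision stumps, plus instance-dependent weights at prediction time derived
# from each stump's accuracy on the K nearest training neighbors.
# Assumptions: binary labels in {-1, +1}; Euclidean distance; K and the number
# of estimators are illustrative values, not values from the paper.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import NearestNeighbors


class KNNWeightedAdaBoost:
    def __init__(self, n_estimators=50, k=10):
        self.n_estimators = n_estimators
        self.k = k

    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        n = len(X)
        w = np.full(n, 1.0 / n)                  # sample weights
        self.stumps_, self.alphas_ = [], []
        for _ in range(self.n_estimators):
            stump = DecisionTreeClassifier(max_depth=1)
            stump.fit(X, y, sample_weight=w)
            pred = stump.predict(X)
            err = np.clip(np.sum(w * (pred != y)), 1e-10, 1 - 1e-10)
            alpha = 0.5 * np.log((1 - err) / err)
            w *= np.exp(-alpha * y * pred)       # standard AdaBoost reweighting
            w /= w.sum()
            self.stumps_.append(stump)
            self.alphas_.append(alpha)
        # Keep the training set so K-nearest-neighbor statistics can be
        # computed for each test instance during prediction.
        self.X_train_, self.y_train_ = X, y
        self.nn_ = NearestNeighbors(n_neighbors=self.k).fit(X)
        return self

    def predict(self, X):
        X = np.asarray(X)
        _, idx = self.nn_.kneighbors(X)          # K nearest training points
        out = np.empty(len(X))
        for i, x in enumerate(X):
            Xk, yk = self.X_train_[idx[i]], self.y_train_[idx[i]]
            score = 0.0
            for stump in self.stumps_:
                # Dynamic, instance-dependent weight: the stump's accuracy on
                # the neighborhood of x replaces the fixed training weight.
                local_acc = np.mean(stump.predict(Xk) == yk)
                score += local_acc * stump.predict(x.reshape(1, -1))[0]
            out[i] = np.sign(score) if score != 0 else 1.0
        return out
```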